Generating functions are a strange concept when you first come across them, but they have many applications and often make otherwise complicated calculations much more straightforward
The probability generating function of a random variable $X$ is $G(t)=E \left(t^X \right)$
For a discrete random variable (on the integers) this is: $G(t)=\sum\limits_{x=-\infty}^{\infty} p(x) t^x$
For a continuous random variable this is: $G(t)=\int\limits_{-\infty}^{\infty} f(x) t^x dx$, where $f(x)$ is the probability density function.
What is the probability generating function that describes a single roll of a fair die?
$G(t)=\frac{1}{6}\left(t^1 + t^2 + t^3 + t^4 + t^5 + t^6 \right)$
Now calculate $\left(G(t)\right)^2$
Can you see the practical application that probability generating functions might have?
You can read off the coefficients: the coefficient of $t^n$ gives the probability of scoring $n$ when you add two dice rolls together
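If you want to check the expansion mechanically, here is a minimal sketch using Python's sympy (a tooling choice assumed here, not part of the original exercise) that squares the PGF and reads off the coefficients:

```python
# Minimal sketch (assumes sympy): expand (G(t))^2 for a fair die
# and read off the coefficient of t^n, i.e. P(two dice sum to n)
import sympy as sp

t = sp.symbols('t')
G = sp.Rational(1, 6) * sum(t**k for k in range(1, 7))  # PGF of one die

G2 = sp.expand(G**2)   # PGF of the sum of two independent dice
poly = sp.Poly(G2, t)
for n in range(2, 13):
    print(n, poly.coeff_monomial(t**n))
```

Running this prints $1/36$ for a score of 2, rising to $1/6$ for a score of 7 and falling back down: the familiar triangular distribution of two dice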
The moment generating function of a random variable $X$ is $M(t)=E \left(e^{tX} \right)$
For a continuous random variable this is: $M(t)=\int\limits_{-\infty}^{\infty} f(x) e^{tx} dx$, where $f(x)$ is the probability density function.
Calculate the moment generating function for the normal distribution:
For $X\sim N(\mu, \sigma^2)$, $M_X(t)=e^{t\mu+\frac{1}{2}\sigma^2t^2}$
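As a sanity check on this result, here is a short sympy sketch (assuming sympy can evaluate the Gaussian integral symbolically, which recent versions can) that recovers the MGF by direct integration:

```python
# Sketch (assumes sympy): recover the normal MGF by integrating f(x) e^{tx}
import sympy as sp

x, t, mu = sp.symbols('x t mu', real=True)
sigma = sp.symbols('sigma', positive=True)

pdf = sp.exp(-(x - mu)**2 / (2 * sigma**2)) / (sigma * sp.sqrt(2 * sp.pi))
M = sp.integrate(pdf * sp.exp(t * x), (x, -sp.oo, sp.oo))
print(sp.simplify(M))  # exp(mu*t + sigma**2*t**2/2)
```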
Now differentiate $M_X(t)$ for $N(\mu, \sigma^2)$ and find $\frac{dM_X(0)}{dt}$ and $\frac{d^2M_X(0)}{dt^2}$
$\frac{dM_X(0)}{dt} = \mu$
$\frac{d^2M_X(0)}{dt^2} = \mu^2 + \sigma^2$
Proof (first moment):
$M_X(t)=e^{t\mu+\frac{1}{2}\sigma^2t^2}$
$\frac{dM_X(t)}{dt}=(\mu+\sigma^2 t) \times e^{t\mu+\frac{1}{2}\sigma^2t^2}$
$\frac{dM_X(0)}{dt}=(\mu+0) \times e^{0} = \mu$
Proof (second moment):
$\frac{d^2M_X(t)}{dt^2}=(\mu+\sigma^2 t)\times (\mu+\sigma^2 t) \times e^{t\mu+\frac{1}{2}\sigma^2t^2} + \sigma^2 \times e^{t\mu+\frac{1}{2}\sigma^2t^2}$
$\frac{d^2M_X(0)}{dt^2}=(\mu+0)\times (\mu+0) \times e^{0} + \sigma^2 \times e^{0}$
$\frac{d^2M_X(0)}{dt^2}=\mu^2 + \sigma^2$
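Both derivatives can also be checked mechanically with the same sympy approach (again a tooling assumption, not part of the original working):

```python
# Sketch (assumes sympy): evaluate M'(0) and M''(0) for the normal MGF
import sympy as sp

t, mu = sp.symbols('t mu', real=True)
sigma = sp.symbols('sigma', positive=True)

M = sp.exp(t * mu + sp.Rational(1, 2) * sigma**2 * t**2)
print(sp.diff(M, t).subs(t, 0))     # mu, the first moment
print(sp.diff(M, t, 2).subs(t, 0))  # mu**2 + sigma**2, the second moment
```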
The characteristic function is very closely related to the moment generating function; it is defined as
$\varphi_X(t)=E\left(e^{itX}\right)$
It is the Fourier transform of the probability density function and, unlike the moment generating function, it exists for every distribution, which makes it particularly important
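As an illustration, the characteristic function of the standard normal can be computed the same way (a sketch assuming sympy handles the complex Gaussian integral):

```python
# Sketch (assumes sympy): characteristic function of N(0, 1)
import sympy as sp

x, t = sp.symbols('x t', real=True)
pdf = sp.exp(-x**2 / 2) / sp.sqrt(2 * sp.pi)
phi = sp.integrate(pdf * sp.exp(sp.I * t * x), (x, -sp.oo, sp.oo))
print(sp.simplify(phi))  # exp(-t**2/2)
```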
The cumulant generating function of a random variable $X$ is $K(\theta)=\log E \left(e^{\theta X} \right)$, i.e. the logarithm of the moment generating function
We can see that for $X\sim N(\mu, \sigma^2)$, $K_X(\theta)=\theta\mu+\frac{1}{2}\sigma^2\theta^2$
So $K'(\theta)=\mu + \sigma^2 \theta$, $K''(\theta)=\sigma^2$ and $K'''(\theta)=0$
The cumulants are $\kappa_1=\mu$, $\kappa_2=\sigma^2$, $\kappa_n=0$ for $n \geqslant 3$
How do you think the cumulants are defined?
$\kappa_n=\frac{d^nK(0)}{d\theta^n}$
This allows us to connect the cumulant generating function with its Maclaurin expansion and write:
$K(\theta)=\frac{\kappa_1 \theta^1}{1!} + \frac{\kappa_2 \theta^2}{2!} + \frac{\kappa_3 \theta^3}{3!} + \frac{\kappa_4 \theta^4}{4!} + ...$
The dummy variable $t$ has been replaced by $\theta$ in the above to avoid confusion with time
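To tie the section together, here is a final sympy sketch (same tooling assumption as above) that recovers the cumulants of $N(\mu, \sigma^2)$ from $K(\theta)$:

```python
# Sketch (assumes sympy): cumulants of N(mu, sigma^2) from K(theta) = log M(theta)
import sympy as sp

theta, mu = sp.symbols('theta mu', real=True)
sigma = sp.symbols('sigma', positive=True)

M = sp.exp(theta * mu + sp.Rational(1, 2) * sigma**2 * theta**2)
K = sp.expand_log(sp.log(M), force=True)  # collapses to theta*mu + sigma**2*theta**2/2

# kappa_n is the n-th derivative of K at theta = 0
for n in range(1, 5):
    print(n, sp.diff(K, theta, n).subs(theta, 0))  # mu, sigma**2, 0, 0

# the Maclaurin expansion reproduces K(theta) term by term
print(sp.series(K, theta, 0, 5))
```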